Fast Tree-Structured Recursive Neural Tensor Networks

Authors

  • Anand Avati
  • Nai-Chia Chen
  • Youssef Ahres
Abstract

In this project we explore ways to optimize the training computation of a Tree-structured RNTN, in particular batching techniques that combine many matrix-vector multiplications into matrix-matrix multiplications, and many tensor-vector operations into tensor-matrix operations. We assume that training is performed with the mini-batch AdaGrad algorithm, and explore how to exploit the presence of multiple examples by batching computation across the mini-batch as a whole. We apply these optimizations to both the forward-propagation and back-propagation phases, and run the batched operations on GPUs. Our goal is to speed up the execution of Tree-structured RNNs so that runtime performance is no longer a limiting factor for their adoption.
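The core batching idea from the abstract can be illustrated with a minimal NumPy sketch (all shapes and variable names here are illustrative assumptions, not the authors' implementation): nodes at the same tree level that share a composition matrix can have their input vectors stacked as columns, turning many matrix-vector products into one matrix-matrix product, and many tensor-vector contractions into one tensor-matrix contraction.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 64                        # hidden size, nodes at this level (illustrative)
W = rng.standard_normal((d, 2 * d))  # composition matrix applied to child pairs
X = rng.standard_normal((2 * d, n))  # n stacked child-pair vectors, one per column

# Unbatched: one matrix-vector product (GEMV) per node.
linear_unbatched = np.stack([W @ X[:, c] for c in range(n)], axis=1)

# Batched: a single matrix-matrix product (GEMM) over all n nodes.
linear_batched = W @ X

# RNTN bilinear (tensor) term: each output slice k computes x^T V[k] x.
V = rng.standard_normal((d, 2 * d, 2 * d))

# Unbatched: one tensor-vector contraction per node.
tensor_unbatched = np.stack(
    [np.einsum('kij,i,j->k', V, X[:, c], X[:, c]) for c in range(n)], axis=1)

# Batched: one tensor-matrix contraction over all n nodes at once.
tensor_batched = np.einsum('kij,in,jn->kn', V, X, X)

assert np.allclose(linear_unbatched, linear_batched)
assert np.allclose(tensor_unbatched, tensor_batched)
```

The batched forms map directly onto GEMM and batched-contraction kernels, which is what makes GPU execution worthwhile: a single large operation amortizes launch overhead that many small per-node operations would pay repeatedly.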


Related papers

Recurrent networks for structured data - A unifying approach and its properties

We consider recurrent neural networks which deal with symbolic formulas, terms, or, generally speaking, tree-structured data. Approaches like the recursive autoassociative memory, discrete-time recurrent networks, folding networks, tensor construction, holographic reduced representations, and recursive reduced descriptions fall into this category. They share the basic dynamics of how structured...


Recursive Neural Networks Can Learn Logical Semantics

Tree-structured recursive neural networks (TreeRNNs) for sentence meaning have been successful for many applications, but it remains an open question whether the fixed-length representations that they learn can support tasks as demanding as logical deduction. We pursue this question by evaluating whether two such models— plain TreeRNNs and tree-structured neural tensor networks (TreeRNTNs)—can ...


A preliminary empirical comparison of recursive neural networks and tree kernel methods on regression tasks for tree structured domains

The aim of this paper is to start a comparison between Recursive Neural Networks (RecNN) and kernel methods for structured data, specifically Support Vector Regression (SVR) machine using a Tree Kernel, in the context of regression tasks for trees. Both the approaches can deal directly with a structured input representation and differ in the construction of the feature space from structured dat...


Tree-Structured Composition in Neural Networks without Tree-Structured Architectures

Tree-structured neural networks encode a particular tree geometry for a sentence in the network design. However, these models have at best only slightly outperformed simpler sequence-based models. We hypothesize that neural sequence models like LSTMs are in fact able to discover and implicitly use recursive compositional structure, at least for tasks with clear cues to that structure in the dat...


Conquering vanishing gradient: Tensor Tree LSTM on aspect-sentiment classification

Our project focuses on the problem of aspect-specific sentiment analysis using recursive neural networks. Unlike previous studies, where labels exist on every node of the constituency tree, we have only one label per sentence, placed on the root node, and this causes a severe vanishing gradient problem for both RNN and RNTN. To deal with this problem, we develop a classification algo...



Publication date: 2015